Building a Raspberry Pi Robot and Controlling it with Scratch – Part 2

Welcome to the second part of our series of posts, describing the workshop we ran at the recent Digimakers event at @Bristol. In the last post we described the outline of the workshop and looked at the hardware of the Raspberry Pi robot that we built for the event. In this post we describe the software running on the robot, and how we set it up. Hopefully this post will give some useful ideas for those wanting to set up their own Raspberry Pi robot.

Overall Architecture

This post is, to say the least, a bit geeky and a bit technical, so you may find your eyes glazing over if you're just reading out of general interest. The diagram below should therefore give you an overview of the general architecture of the system, and you can dip into the relevant sections of the rest of the post for more detail if you want.

Robot Control Architecture

Software Installation

We began with a vanilla Raspbian install, simply because that's what we're most familiar with, and then installed a number of packages:

sudo apt-get update
sudo apt-get install python-serial libopencv-dev cmake arduino python-dev python-pip

This gives us pySerial for communication via serial ports, OpenCV for computer vision, CMake to build code, and the Arduino IDE to upload control code to the Arduino Mini Driver board that drives the robot's motors. It also lets us install the RPIO library, used to control the GPIO pins of the Raspberry Pi, by running

sudo pip install RPIO

Getting the Robot Code

All of the robot code can be found in our GitHub repository here. Download it to the Raspberry Pi by running

git clone https://github.com/DawnRobotics/digimakers_scratch_workshop.git

Programming the Arduino Mini Driver

As we discussed in part 1 of this series, we could have controlled the motor drivers on the robot's Explorer PCB directly from the Raspberry Pi, but we had problems getting interrupts to run quickly enough on the Raspberry Pi to accurately keep track of the robot's encoders. Therefore we used an Arduino Mini Driver board, and wrote a small sketch that accepts commands over a serial USB link to go forwards, go backwards, turn left and so on, and continuously sends back the tick counts for the left and right encoders.

The sketch can be found in the mini_driver_sketch folder of our code, and can be uploaded with the Arduino IDE.

Using the Camera to Recognise AR Markers

One of the goals for students attending the workshop was to get the robot to locate an 'alien artifact' hidden behind some Martian rocks. For our alien artifacts we used Augmented Reality (AR) markers. These are similar to QR codes, and are much easier for computer vision code to identify than arbitrary objects such as a cup or a cuddly toy. To generate and recognise the AR markers we used Aruco, an excellent open source AR marker library that makes use of OpenCV.

You can make and install Aruco on your Raspberry Pi by first downloading aruco-1.2.4.tgz from here, and then running the following commands

tar xvzf aruco-1.2.4.tgz
cd aruco-1.2.4
mkdir build
cd build
cmake ../
make
sudo make install

Aruco comes with a number of example programs that you can play around with to get a feel for it. One of these is a program called aruco_create_marker which you can use to create the AR markers for the alien artifacts. We created a number of different markers for the workshop and then put them together into a PDF for easy printing.

We used a modified version of one of the Aruco test programs to detect AR markers in images taken with the raspistill program. All of this was bolted together with a Python script that takes a photo and then runs the detection program to look for AR markers. It's a bit hacky, and could probably be done much better, but it shows how Python lets you throw things together when under pressure. ;)
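As a rough illustration, the glue script might look something like the sketch below. The photo location, the detector binary's name and its output format are all assumptions here, not the actual workshop code:

```python
import subprocess

PHOTO_FILE = "/tmp/artifact_photo.jpg"      # hypothetical photo location
DETECTOR = "./ar_marker_detector/detector"  # hypothetical binary name

def parse_marker_ids(detector_output):
    """Parse marker ids from detector output, assumed to be one id per line."""
    return [int(line) for line in detector_output.splitlines() if line.strip()]

def find_artifacts():
    """Take a photo with raspistill, then look for AR markers in it."""
    # -t 1 keeps the preview delay minimal, -o names the output file
    if subprocess.call(["raspistill", "-t", "1", "-o", PHOTO_FILE]) != 0:
        return []   # camera not available
    output = subprocess.check_output([DETECTOR, PHOTO_FILE])
    return parse_marker_ids(output.decode("utf-8"))
```

Chaining external programs with subprocess like this is crude, but it meant we could reuse the raspistill and Aruco binaries unchanged.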

To get the detection code to work, first enable the Raspberry Pi’s camera by running

sudo raspi-config

and choosing the ‘Enable Camera’ option. Then build the detection program by running

cd ~/digimakers_scratch_workshop/ar_marker_detector
make

You can test that the detection works by putting an AR marker in view of the camera and running

python ~/digimakers_scratch_workshop/scripts/artifact_find_test.py

Enabling Bluetooth Serial Communication

In the workshop, the Raspberry Pi robot was controlled remotely by a laptop running Scratch, using a USB Bluetooth dongle. Now, we could have used a USB Bluetooth dongle on the Raspberry Pi robot as well, but quite frankly, in the limited time we had available before the workshop, working out how to pair two Linux computers with USB Bluetooth dongles and set up serial communications was beyond us. Therefore we wired the Bluetooth serial module that we sell to the RX and TX lines of the Raspberry Pi's GPIO header, and used that UART port instead.

This meant that we needed to enable the GPIO serial port using the instructions at the bottom of this page.

Running the Control Code

At this point the robot is set up and we’re ready to run the Python control program scripts/rover_5_server.py. This control program listens continuously on the Bluetooth serial port for commands from the controlling laptop. In turn it sends commands to the Arduino Mini Driver whenever the robot needs to move, and transmits data about the robot back across the Bluetooth serial port.
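In outline, the server's main loop might look something like the sketch below. The port names, baud rates and exact command handling are assumptions for illustration, not code taken from the actual rover_5_server.py:

```python
VALID_COMMANDS = set("fblrsup")  # the robot's single-character commands

def is_valid_command(command):
    """True if command is one of the robot's single-character commands."""
    return command in VALID_COMMANDS

def relay(bluetooth, arduino):
    """Forward valid commands from the Bluetooth serial port to the
    Arduino, and stream the Arduino's sensor data back the other way."""
    while True:
        if bluetooth.inWaiting() > 0:
            command = bluetooth.read(1).decode("ascii", "ignore")
            if is_valid_command(command):
                arduino.write(command.encode("ascii"))
        # Pass any sensor data straight back to the controlling laptop
        data = arduino.read(arduino.inWaiting())
        if data:
            bluetooth.write(data)

if __name__ == "__main__":
    import serial  # pySerial, installed earlier via python-serial
    bluetooth = serial.Serial("/dev/ttyAMA0", 9600, timeout=0)  # GPIO UART
    arduino = serial.Serial("/dev/ttyUSB0", 57600, timeout=0)   # Mini Driver
    relay(bluetooth, arduino)
```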

You can start the control program manually by running

sudo python ~/digimakers_scratch_workshop/scripts/rover_5_server.py

However, what we really want is to have this program running every time the Raspberry Pi robot starts up. This can be done by running it as a cron job. Run

sudo crontab -e

and add the following line to the bottom of the file

@reboot python /home/pi/digimakers_scratch_workshop/scripts/rover_5_server.py &

Now, you can test that the robot works by using a terminal program such as HyperTerminal or CuteCom to talk to it. It will take a bit of time for the robot to start responding after boot up, but once it does, you should see it print out a stream of numbers. These correspond to

Left encoder tick, right encoder tick, ultrasonic sensor distance, last AR detection attempt number, detected AR marker ID, heading to detected AR marker
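If you'd rather parse that stream in your own code than watch it scroll past in a terminal, something along these lines would work. We're assuming the values arrive comma-separated, and the field names are just descriptive labels, not names from the robot code:

```python
STATUS_FIELDS = ["left_ticks", "right_ticks", "ultrasonic_distance",
                 "detection_attempt", "marker_id", "marker_heading"]

def parse_status_line(line):
    """Split one comma-separated status line from the robot into a dict.

    Raises ValueError if the line doesn't hold exactly six numbers.
    """
    values = [float(x) for x in line.split(",")]
    if len(values) != len(STATUS_FIELDS):
        raise ValueError("expected %d fields, got %d"
                         % (len(STATUS_FIELDS), len(values)))
    return dict(zip(STATUS_FIELDS, values))
```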

You can control the robot by sending the following commands over serial

  • f – Drive forwards
  • b – Drive backwards
  • l – Turn left
  • r – Turn right
  • s – Stop
  • u – Sound the buzzer (if a BerryClip is attached)
  • p – Try to detect an artifact
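Rather than typing into a terminal program, you could also drive the robot from a short pySerial script like this one. The port name is an assumption; use whatever device your Bluetooth connection shows up as:

```python
# Friendly names for the robot's single-character serial commands
COMMANDS = {
    "forwards": "f", "backwards": "b", "left": "l", "right": "r",
    "stop": "s", "buzzer": "u", "detect": "p",
}

def send_command(port, name):
    """Write the single-character command for name to an open serial port."""
    port.write(COMMANDS[name].encode("ascii"))

if __name__ == "__main__":
    import serial  # pySerial
    import time
    port = serial.Serial("/dev/rfcomm0", 9600, timeout=1)  # assumed port name
    send_command(port, "forwards")  # drive forwards for a second...
    time.sleep(1.0)
    send_command(port, "stop")      # ...then stop
    port.close()
```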

Next Time…

So, in this post we’ve looked at the software running on the Raspberry Pi robot, and gone over the main installation steps that were needed to set it up. In the next and final post in this series, we’ll look at the software needed to interface Scratch to the robot, and talk about our experience actually running the workshop…
